A Penalization Criterion Based on Noise Behaviour for Model Selection

Authors

  • Joaquín Pizarro Junquera
  • Pedro Galindo Riaño
  • Elisa Guerrero Vázquez
  • Andrés Yáñez Escolano
Abstract

Complexity-penalization strategies are one way to decide on the most appropriate network size in order to address the trade-off between overfitted and underfitted models. In this paper we propose a new penalty term derived from the behaviour of candidate models under noisy conditions that seems to be much more robust against catastrophic overfitting errors than standard techniques. This strategy is applied to several regression problems using polynomial functions, univariate autoregressive models and RBF neural networks. The simulation study at the end of the paper will show that the proposed criterion is extremely competitive when compared to state-of-the-art criteria.
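To illustrate the general idea of complexity penalization in the polynomial-regression setting the abstract mentions, the sketch below selects a polynomial degree by adding an AIC-style complexity term to the training error. The data, noise level, and penalty form are illustrative assumptions, not the paper's own noise-derived criterion.

```python
import numpy as np

# Illustrative sketch: pick a polynomial degree by penalizing fit with
# a complexity term (AIC-style), rather than minimizing training error alone.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = np.sin(np.pi * x) + rng.normal(scale=0.2, size=x.size)  # noisy target

def aic_score(degree):
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    # AIC = n*log(RSS/n) + 2k: the 2k term penalizes model complexity,
    # so higher-degree fits must earn their extra parameters.
    return n * np.log(np.mean(residuals ** 2)) + 2 * k

scores = {d: aic_score(d) for d in range(1, 10)}
best = min(scores, key=scores.get)
print("selected degree:", best)
```

Without the penalty term, the degree-9 model would always win on training error; the complexity term is what prevents that catastrophic overfit.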


Similar articles

Noise derived information criterion for model selection

This paper proposes a new complexity-penalization model selection strategy derived from the minimum risk principle and the behavior of candidate models under noisy conditions. This strategy seems to be robust in small-sample conditions and tends to the AIC criterion as the sample size grows. The simulation study at the end of the paper will show that the proposed criterion is extremely compet...


Bridging Information Criteria and Parameter Shrinkage for Model Selection

Model selection based on classical information criteria, such as BIC, is generally computationally demanding, but its properties are well studied. On the other hand, model selection based on parameter shrinkage by l1-type penalties is computationally efficient. In this paper we make an attempt to combine their strengths, and propose a simple approach that penalizes the likelihood with data-depe...
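As a reference point for the ℓ1-type shrinkage this abstract describes, the following sketch shows the classic soft-thresholding form the lasso takes under an orthonormal design. The data, sparsity pattern, and threshold value are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of l1-type parameter shrinkage: for an orthonormal design
# (X'X = I), the lasso solution is soft-thresholding of the least-squares
# coefficients, which zeroes out small coefficients and thus selects a model.
rng = np.random.default_rng(1)
n, p = 100, 5
X, _ = np.linalg.qr(rng.normal(size=(n, p)))  # orthonormal columns
beta_true = np.array([3.0, 0.0, -2.0, 0.0, 0.0])  # sparse ground truth
y = X @ beta_true + rng.normal(scale=0.1, size=n)

beta_ls = X.T @ y  # least-squares estimate, since X'X = I
lam = 0.5
# soft-threshold: shrink every coefficient towards zero by lam,
# setting those smaller than lam exactly to zero
beta_lasso = np.sign(beta_ls) * np.maximum(np.abs(beta_ls) - lam, 0.0)
print("nonzero coefficients:", np.flatnonzero(beta_lasso))
```

The key contrast with information criteria such as BIC is visible here: no exhaustive search over submodels is needed; the penalty performs the selection directly on the coefficients.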


Model selection by resampling penalization

We present a new family of model selection algorithms based on the resampling heuristic. It can be used in several frameworks, does not require any knowledge of the unknown law of the data, and may be seen as a generalization of local Rademacher complexities and V-fold cross-validation. In the case example of least-squares regression on histograms, we prove oracle inequalities, and that these ...


Bayesian-driven criterion to automatically select the regularization parameter in the ℓ1-Potts model

This contribution focuses, within the ℓ1-Potts model, on the automated estimation of the regularization parameter balancing the ℓ1 data fidelity term and the TVℓ0 penalization. Variational approaches based on total variation gained considerable interest to solve piecewise constant denoising problems thanks to their deterministic setting and low computational cost. However, the quality of the ac...


Model Selection in Linear Mixed Models Using MDL Criterion with an Application to Spline Smoothing

For spline smoothing one can rewrite the smooth estimation as a linear mixed model (LMM), where the smoothing parameter appears as the variance of the spline basis coefficients. Smoothing methods that use basis functions with penalization can utilize maximum likelihood (ML) theory in the LMM framework [8]. We introduce the minimum description length (MDL) model selection criterion in LMM and propose a...



Journal title:

Volume   Issue

Pages  -

Publication date: 2001